
Whenever we read about entropy, we tend to generalize it as a tendency toward disorder or chaos. While that idea is not entirely wrong, one of the most insightful ways of looking at it is as the tendency of energy to spread out over time. This fundamental concept governs nearly everything, from tiny molecular collisions to vast cosmic events, from the beginning of the universe to its eventual end. By the way, here is an engaging YouTube video if you want to explore it further – The Most Misunderstood Concept in Physics.
So, the next obvious question is: why does energy spread out in the first place?
Let’s take an example: imagine a garden with several small ponds. If a few fish gather in one pond, over time they will swim into the other ponds, spreading out naturally. It’s very unlikely that all the fish will suddenly swim back into just one pond on their own, leaving the others empty.
Similarly, for everyday systems with trillions of atoms and energy packets, the chance of energy spontaneously un-spreading is so vanishingly small that it effectively never happens, which is why we never observe it.
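To get a feel for just how unlikely un-spreading is, here is a toy calculation (a Python sketch; the half-box model and the molecule counts are chosen purely for illustration): if each of n independent molecules is equally likely to sit in either half of a box, the chance that all of them land in the left half at the same moment is (1/2)^n.

```python
# Toy model: each of n molecules independently sits in either half of a
# box with probability 1/2. The chance that all n are found in the left
# half at once (matter/energy "un-spreading") is (1/2)**n.
def prob_all_left(n: int) -> float:
    return 0.5 ** n

for n in (10, 100, 1000):
    print(f"{n:>5} molecules: {prob_all_left(n):.3e}")
```

With just 1000 molecules the probability is already below 10^-300; a real macroscopic system has on the order of 10^23 molecules, which is why the fish never all return to one pond.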
Is Your Air Conditioner Fighting Entropy, or Feeding It?
This principle also explains seemingly counter-intuitive phenomena, like how air conditioning cools a house. When we run an AC inside our homes, we’re making the inside more organized, which seems to reduce the entropy inside the house. But to do that, the air conditioner draws electricity from a power plant. The power plant burns fuel to generate that electricity, spreading a great deal of energy out as heat, which is a big increase in entropy in a different place. The total entropy of the whole system still goes up.
Extending the idea, our very existence on Earth is proof of this constant flow of entropy. The Earth is not a closed system: it continuously receives a steady stream of low-entropy energy from the sun. While we absorb that energy, we’re also constantly radiating almost the same amount back into space, in a more spread-out, higher-entropy form.
Life, from plants to animals, captures this low-entropy energy and uses it to grow, maintain bodies, and move, and in doing so, the energy becomes more spread out.
Did Life Evolve Just to Speed Up Entropy?
Some ideas suggest that life might have developed because it helps speed up this natural tendency to turn ordered states into more disordered ones.
Here’s how: The universe started out very hot and full of energy. As it expanded and cooled down, pieces of matter stuck together to make stars, planets, and galaxies. When these objects formed, their gravity pulled particles together, turning stored-up energy into movement (kinetic energy). When particles collided, their movements turned into heat. This process made the universe more disorganized (higher entropy). Over a long time, the energy that could do useful work spread out and became less concentrated, making it harder to use for anything useful.
Rocket Propulsion, a Lesson in Entropy
As I was letting my brain marinate in it, trying to make sense of the whole thing, I kept circling back to rockets. There’s something about how they work that lines up with all this talk about energy and entropy.
A rocket starts with tightly packed chemical energy in its fuel tanks. Once the fuel burns, the exhaust gases shoot out the back at incredible speed. As a result, the rocket moves upward, but you never get all that energy back. Some goes into heat, some into random motion you can’t use. And this irreversible spreading of energy is what physicists describe as an increase in entropy.

Does Entropy Have to Do with a Model Picking Words?
Now, let’s go a step further, extend the equation or swap the variables as follows:
- Rocket = A large language model
- Fuel = All the patterns it’s stored from training
We give an LLM a prompt and it starts generating text. Now, this part caught my attention:
LLMs also deal with entropy: not the thermodynamic kind, but statistical (Shannon) entropy.
In language models, we can think about entropy in terms of how spread out probability becomes across possible outputs.
Let’s say you give a model a prompt like:
“The sun is a…”
The model has to decide what comes next. It might assign a 40% chance to “star”, 20% to “giant”, 10% to “source”, and so on. The way these chances are spread out is called a distribution.
Entropy measures how uncertain the model is about its choice. If one word has a high chance (like 40%) and the others are very unlikely, then the model is “sure” about its choice, which means low entropy.
If the chances are more evenly spread out among many words, the model is less sure, and the entropy is higher.
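This intuition has a standard formula, Shannon entropy: H = -Σ p·log2(p). A minimal Python sketch makes the contrast concrete (the probabilities below are made up for illustration, not actual model outputs):

```python
import math

def shannon_entropy(probs):
    # H = -sum(p * log2(p)), measured in bits; skip zero-probability terms
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A "confident" next-token distribution, like the "The sun is a..." example
# (illustrative numbers, not real model probabilities)
peaked = [0.4, 0.2, 0.1, 0.1, 0.1, 0.05, 0.05]

# Maximum uncertainty: the same 7 candidate words, all equally likely
uniform = [1 / 7] * 7

print(f"peaked:  {shannon_entropy(peaked):.3f} bits")   # lower entropy
print(f"uniform: {shannon_entropy(uniform):.3f} bits")  # log2(7) ≈ 2.807 bits
```

The peaked distribution gives a smaller number than the uniform one, matching the intuition: the more the probability is concentrated on one word, the lower the entropy, and a distribution spread evenly across many words is maximally uncertain.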
In that way, the model mirrors what happens in a rocket: not physically, of course, but conceptually. Both start with something focused and concentrated: fuel in one case, a prompt in the other. And both result in something more spread out. In a rocket, it’s energy scattering as heat and motion. In an LLM, it’s probability spreading across a space of possible words.
Takeaway
All these concepts, fish in ponds, heat from an air conditioner, propulsion in rockets, or words in a language model, start with something focused and end up dispersed, more uncertain, more spread out. This reflects a core reality: systems naturally move from order to disorder, from certainty to uncertainty.
Understanding this shift from order to disorder isn’t just a curious fact; it’s an opportunity to rethink how we approach problems.
Instead of trying to stop or fight natural trends toward disorder, we can learn to work with them. By understanding how things naturally spread out or become less orderly, we can create tools and systems that use energy, information, or other resources more effectively. It’s more about guiding the process so things spread out in useful directions instead of just letting them scatter everywhere randomly.
As Frederic Keffer once said,
“The future belongs to those who can manipulate entropy; those who understand but energy will be only accountants.”